Results 1 - 20 of 2,630
1.
Forensic Sci Int Synerg ; 8: 100465, 2024.
Article in English | MEDLINE | ID: mdl-38596784

ABSTRACT

The use of collaborative exercises (CE) and proficiency tests (PT) as part of the governance programme for any forensic science laboratory has become commonplace and is recommended by several international organisations. Traditionally these have been discipline-specific exercises testing a laboratory's ability in a single area of forensic science. However, the "real" world is normally more complex and, in many instances, forensic material must be examined for a number of different evidence types. This article summarises the concepts, planning, design, preparation, implementation, co-ordination and evaluation of the 2022 Multidisciplinary Collaborative Exercise (2022-MdCE) covering a range of forensic disciplines, specifically DNA, fingerprints, documents and handwriting. The exercise consisted of a questioned letter with typescript text and a signature. In addition, the letter contained a visible bloody fingermark in the area of the signature, a visible stain in the lower left-hand corner, a latent fingermark and an indented impression. The analysis of the results showed that, in the investigation of the bloody fingermark, priority was given to the DNA examination. Some critical issues emerged in relation to the biological (DNA)/ink sampling strategies when applied before fingermark visualisation. Another outcome of the exercise was to demonstrate the importance of indented impressions, which were underestimated by a significant number of participants. From the setters' perspective, more in-depth studies are needed to produce consistent samples; this concerns all the disciplines involved, but especially DNA and fingermarks. Based on this exercise, it is believed that this approach to testing forensic disciplines allows good practice within the various scientific areas to be analysed, and the process and sequence of examinations of material within a forensic laboratory to be scrutinised in the way that best preserves all types of evidence.

2.
Front Robot AI ; 11: 1303440, 2024.
Article in English | MEDLINE | ID: mdl-38646473

ABSTRACT

Conventional techniques for sharing paper documents in teleconferencing tend to introduce two inconsistencies: 1) media inconsistency: a paper document is converted into a digital image on the remote site; 2) space inconsistency: a workspace deliberately inverts the partner's handwriting to make a document easy to read. In this paper, we present a novel system that eliminates these inconsistencies. The media and space inconsistencies are resolved by reproducing a real paper document on a remote site and allowing a user to hand over the paper document to a remote partner across a videoconferencing display. From a series of experiments, we found that reproducing a real paper document contributes to a higher sense of information sharing. We also found that handing over a document enhances a sense of space sharing, regardless of whether the document is digital or paper-based. These findings provide insights into designing systems for sharing paper documents across distances.

3.
Polymers (Basel) ; 16(5), 2024 Feb 24.
Article in English | MEDLINE | ID: mdl-38475302

ABSTRACT

Paper documents are an important carrier of information related to human civilization, and the reinforcement of fragile paper documents is a key aspect of their protection. This research used an amphoteric polyvinylamine polymer as a paper-reinforcement agent to strengthen the Xuan paper commonly used in paper documents. Scanning electron microscopy (SEM), Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), X-ray photoelectron spectroscopy (XPS), solid-state 13C NMR, and other analytical methods were employed to compare the physical properties, micro-morphology, crystallinity, and aging resistance of the paper before and after reinforcement. The reinforcement effect, aging resistance, and effects on the fiber structure were investigated. The results indicate that polyethylenimine has a filling and bridging effect between the paper fibers. After treatment with polyethylenimine, there was a significant improvement in the folding endurance and tensile strength of the paper. Additionally, the paper maintained good mechanical strength even after dry-heat and humid aging.

4.
Clin Ther ; 46(2): 85-89, 2024 02.
Article in English | MEDLINE | ID: mdl-38342708

ABSTRACT

INTRODUCTION: Emeritus Editor-in-Chief Richard Shader published two editorials in 2014 stating that Clinical Therapeutics would no longer consider simple innovator-versus-generic bioequivalence studies for publication and would require a rationale for the choice of agents when drug-drug interaction studies were submitted for consideration. The intervening decade of developments in this field provides an opportunity to comment on these trends. Lewis Sheiner's "Learn and Confirm" framework anchors the subsequent discussion of the goals of early development of pharmaceutical agents. Subsequent experience with newer agents focused on immunological targets has led to a shift from the simple No Observable Effect Level (NOEL) model to the Minimal Anticipated Biological Effect Level (MABEL) model for biologically focused effects when assessing pre-clinical data to guide the selection of a starting dose for first-in-human studies. ELEMENTS OF PHASE I STUDIES: The primary tasks of Phase I activities are to describe the pharmacokinetics (determination of absorption, distribution, metabolism, and excretion) and essential pharmacodynamics (the dose correlation with physiological responses, plus any untoward effects, including idiosyncratic responses), keeping reporting requirements in mind. Other Phase I activities, usually conducted later in the development cycle, include evaluation of drug interactions with food and other pharmaceutical agents and thorough QT studies. INNOVATIONS: Phase I studies have been evolving in response to unrelenting pressures to improve access and efficiency in time, cost, and effort. Changes have been occurring in the characteristics of the participating populations and the starting dose, along with shifts in the enrollment schedule toward more flexible, data-driven, adaptive designs. ISSUES: Additional issues have gained attention in the recent past, including Phase 0/microdosing, use of Phase I studies explicitly for treatment in the case of oncological products, involvement of Data Safety Monitoring Committees, especially for first-in-class molecules, and improved means of optimizing the selection of candidate agents for advancement to subsequent stages of development. Of final importance is the need for greater transparency of the presently inaccessible early-development study data maintained in commercial corporate legacy databases. Taken together, these developments and innovations by a broad range of stakeholders point to continuing opportunities for clinical investigators to explore the potential of Phase I studies to contribute to their own specialties.
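
Editorial note: as a hedged illustration of the NOAEL-based starting-dose arithmetic that the MABEL approach refines, the conversion commonly cited from the FDA 2005 first-in-human guidance can be written as follows; the numerical example is illustrative only.

```latex
% Illustrative NOAEL-based starting-dose arithmetic (standard Km values: rat = 6, human = 37).
% A MABEL-based dose instead starts from a minimal pharmacological threshold (e.g. low
% receptor occupancy) rather than a toxicity endpoint.
\[
\mathrm{HED} = \mathrm{NOAEL}_{\text{animal}} \times \frac{K_{m,\text{animal}}}{K_{m,\text{human}}},
\qquad
\mathrm{MRSD} = \frac{\mathrm{HED}}{\text{safety factor (typically 10)}}
\]
\[
\text{e.g. } 50\ \mathrm{mg/kg\ (rat)} \times \frac{6}{37} \approx 8.1\ \mathrm{mg/kg},
\qquad
\mathrm{MRSD} \approx 0.81\ \mathrm{mg/kg}
\]
```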


Subjects
Research Design, Humans, Therapeutic Equivalence, Pharmaceutical Preparations
5.
Heliyon ; 10(3): e24979, 2024 Feb 15.
Article in English | MEDLINE | ID: mdl-38317945

ABSTRACT

The tremendous increase in Microfinance publications since 2000 has highlighted the need for, and importance of, innovative techniques to present big data in this field in an informative, scientific, and summarized manner. The study highlights the trends and patterns of the Microfinance literature, revealing what has been done and what could be done in the future. It comprises 1429 Microfinance publications extracted from the Scopus database. The authors adopt bibliometric analysis using the open-source software R and network analysis techniques using the Gephi and VOSviewer software. The study adds a valuable contribution to the field of Microfinance by distinctively summarizing the important literature. It identifies global academic research trends and provides insights into trending topics, highly cited literature, authors, countries, collaboration networks, word clouds, citation analysis, and more. Finally, based on this extensive bibliometric literature survey, the study highlights the scope for future research in Microfinance.
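
Editorial note: for readers who want to try this kind of network analysis, a minimal sketch is given below. The study itself used R together with Gephi and VOSviewer, so this Python/networkx version is only an analogous illustration; the input file name and the "Author Keywords" column are hypothetical.

```python
# Minimal keyword co-occurrence network from a Scopus-style export, saved for Gephi.
from itertools import combinations

import networkx as nx
import pandas as pd

records = pd.read_csv("scopus_microfinance.csv")  # hypothetical export path

G = nx.Graph()
for kw_field in records["Author Keywords"].dropna():
    keywords = [k.strip().lower() for k in kw_field.split(";") if k.strip()]
    for a, b in combinations(sorted(set(keywords)), 2):
        # increment the edge weight for every record in which both keywords co-occur
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

# Export for visualisation in Gephi (GEXF is one of the formats Gephi reads natively)
nx.write_gexf(G, "keyword_cooccurrence.gexf")
```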

7.
Data Brief ; 52: 109953, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38186736

ABSTRACT

This article focuses on the construction of a dataset for multilingual character recognition in Moroccan official documents. The dataset covers languages such as Arabic, French, and Tamazight and is built programmatically to ensure data diversity. It consists of sub-datasets such as the uppercase alphabet (26 classes), lowercase alphabet (26 classes), digits (9 classes), Arabic letters (28 classes), Tifinagh letters (33 classes), symbols (14 classes), and French special characters (16 classes). The dataset construction process involves collecting representative fonts and generating multiple character images using a Python script, providing the comprehensive variety essential for robust recognition models. Moreover, this dataset contributes to the digitization of these diverse official documents and archival papers, essential for preserving cultural heritage and enabling advanced text recognition technologies. The need for this work arises from advancements in character recognition techniques and the significance of large-scale annotated datasets. The proposed dataset contributes to the development of robust character recognition models for practical applications.
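
Editorial note: to make the font-based generation step concrete, here is a minimal Python sketch of rendering characters from collected fonts into labelled images. The font folder, class list, image size and random offsets are illustrative assumptions, not details taken from the dataset itself.

```python
# Render each character class with every available .ttf font into small grayscale images.
import os
import random

from PIL import Image, ImageDraw, ImageFont

FONT_DIR = "fonts"            # assumed folder of collected representative fonts
OUTPUT_DIR = "dataset/upper"  # one sub-dataset, e.g. uppercase Latin letters
CLASSES = [chr(c) for c in range(ord("A"), ord("Z") + 1)]
IMG_SIZE = 64

os.makedirs(OUTPUT_DIR, exist_ok=True)
fonts = [os.path.join(FONT_DIR, f) for f in os.listdir(FONT_DIR) if f.endswith(".ttf")]

for char in CLASSES:
    for i, font_path in enumerate(fonts):
        font = ImageFont.truetype(font_path, size=random.randint(36, 52))
        img = Image.new("L", (IMG_SIZE, IMG_SIZE), color=255)   # white background
        draw = ImageDraw.Draw(img)
        # roughly centre the glyph; small random offsets add diversity to the data
        draw.text((IMG_SIZE // 2 + random.randint(-3, 3),
                   IMG_SIZE // 2 + random.randint(-3, 3)),
                  char, font=font, fill=0, anchor="mm")
        img.save(os.path.join(OUTPUT_DIR, f"{char}_{i}.png"))
```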

8.
Tob Induc Dis ; 22, 2024.
Article in English | MEDLINE | ID: mdl-38196511

ABSTRACT

INTRODUCTION: Tax increases are the most effective but still the least-used tobacco control measure. The tobacco industry (TI) employs lobbying strategies to oppose the implementation of tax policies on its products. Over the past two decades, French tobacco tax policies have been characterized by a relative inconsistency. This research aims to understand why, by analyzing the arguments of French policymakers (MPs and the government) between 2000 and 2020 for or against tax increases. METHODS: To capture parliamentary debates, we performed an advanced term search on the French National Assembly website using the keyword 'tobacco'. The search returned 5126 available documents, of which 1106 (12.6%; 645 questions, 461 responses) covered price and taxation and were included. They were analyzed using descriptive statistics and thematic content analysis (NVivo) and were compared, when relevant, with arguments raised in the international literature on TI lobbying against tax increases. RESULTS: We found 3176 arguments on tobacco taxation: 77.2% were against tobacco tax increases and 22.7% were in favor of tax policies. Arguments varied depending on the source: 92.4% of MPs' arguments were against tax increases, while 52.1% of arguments from government responses were in favor. The anti-tax arguments were similar to those identified in the international literature, singling out negative economic and social consequences (illicit trade, penalizing tobacconists). Other arguments, more specific to the French context, highlighted the key economic and social role played by tobacconists in France. Pro-tax arguments highlighted the health, economic, and social benefits of tax policies. CONCLUSIONS: This is the first French tobacco-control study of parliamentary documents, even though Parliament is a venue for direct TI lobbying. It will enable public health actors to better understand the arguments used by the TI in order to counter them in front of MPs, and to better monitor debates in Parliament.

9.
JMIR Hum Factors ; 11: e53378, 2024 Jan 25.
Article in English | MEDLINE | ID: mdl-38271086

ABSTRACT

BACKGROUND: Adverse events refer to incidents with potential or actual harm to patients in hospitals. These events are typically documented through patient safety event (PSE) reports, which consist of detailed narratives providing contextual information on the occurrences. Accurate classification of PSE reports is crucial for patient safety monitoring. However, this process faces challenges due to inconsistencies in classifications and the sheer volume of reports. Recent advancements in text representation, particularly contextual text representation derived from transformer-based language models, offer a promising solution for more precise PSE report classification. Integrating the machine learning (ML) classifier necessitates a balance between human expertise and artificial intelligence (AI). Central to this integration is the concept of explainability, which is crucial for building trust and ensuring effective human-AI collaboration. OBJECTIVE: This study aims to investigate the efficacy of ML classifiers trained using contextual text representation in automatically classifying PSE reports. Furthermore, the study presents an interface that integrates the ML classifier with the explainability technique to facilitate human-AI collaboration for PSE report classification. METHODS: This study used a data set of 861 PSE reports from a large academic hospital's maternity units in the Southeastern United States. Various ML classifiers were trained with both static and contextual text representations of PSE reports. The trained ML classifiers were evaluated with multiclass classification metrics and the confusion matrix. The local interpretable model-agnostic explanations (LIME) technique was used to provide the rationale for the ML classifier's predictions. An interface that integrates the ML classifier with the LIME technique was designed for incident reporting systems. RESULTS: The top-performing classifier using contextual representation was able to obtain an accuracy of 75.4% (95/126) compared to an accuracy of 66.7% (84/126) by the top-performing classifier trained using static text representation. A PSE reporting interface has been designed to facilitate human-AI collaboration in PSE report classification. In this design, the ML classifier recommends the top 2 most probable event types, along with the explanations for the prediction, enabling PSE reporters and patient safety analysts to choose the most suitable one. The LIME technique showed that the classifier occasionally relies on arbitrary words for classification, emphasizing the necessity of human oversight. CONCLUSIONS: This study demonstrates that training ML classifiers with contextual text representations can significantly enhance the accuracy of PSE report classification. The interface designed in this study lays the foundation for human-AI collaboration in the classification of PSE reports. The insights gained from this research enhance the decision-making process in PSE report classification, enabling hospitals to more efficiently identify potential risks and hazards and enabling patient safety analysts to take timely actions to prevent patient harm.
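
Editorial note: as an illustration of the classification-plus-explanation pipeline summarized above, the following hedged Python sketch pairs a transformer sentence encoder with a simple classifier and LIME. The model name, placeholder reports and event labels are assumptions, not the study's data or code.

```python
# Contextual text representation -> classifier -> LIME explanation for one report.
from lime.lime_text import LimeTextExplainer
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

encoder = SentenceTransformer("all-MiniLM-L6-v2")   # any transformer encoder would do
clf = LogisticRegression(max_iter=1000)

# Placeholder training data -- the study's 861 PSE reports are not public.
train_texts = ["wrong medication dose administered ...", "patient fell while ambulating ..."]
train_labels = [0, 1]                               # e.g. 0 = medication event, 1 = fall
clf.fit(encoder.encode(train_texts), train_labels)

def predict_proba(texts):
    # Raw text in, class probabilities out -- the interface LIME expects.
    return clf.predict_proba(encoder.encode(texts))

explainer = LimeTextExplainer(class_names=["medication", "fall"])  # illustrative classes
explanation = explainer.explain_instance(
    "Patient received an incorrect dose of insulin ...",
    predict_proba, num_features=10)
print(explanation.as_list())                        # the words that drove the prediction
```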


Subjects
Artificial Intelligence, Calcium Compounds, Oxides, Patient Safety, Female, Pregnancy, Humans, Algorithms, Machine Learning
10.
Tob Control ; 2024 Feb 13.
Article in English | MEDLINE | ID: mdl-37137700

ABSTRACT

BACKGROUND: Tobacco corporation Philip Morris International launched the Foundation for a Smoke-Free World (FSFW), a purportedly independent scientific organisation, in 2017. We aimed to systematically investigate FSFW's activities and outputs, comparing these with previous industry attempts to influence science, as identified in the recently developed typology of corporate influence on science, the Science for Profit Model (SPM). DESIGN: We prospectively collected data on FSFW over a 4-year period, 2017-2021, and used document analysis to assess whether FSFW's activities mirror practices tobacco and other industries have historically used to shape science in their own interests. We used the SPM as an analytical framework, working deductively to search for use of the strategies it identifies, and inductively to search for any additional strategies. RESULTS: Marked similarities between FSFW's practices and previous corporate attempts to influence science were observed, including: producing tobacco industry-friendly research and opinion; obscuring industry involvement in science; funding third parties which denigrate science and scientists that may threaten industry profitability; and promoting tobacco industry credibility. CONCLUSIONS: Our paper identifies FSFW as a new vehicle for agnogenesis, indicating that, over 70 years since the tobacco industry began to manipulate science, efforts to protect science from its interference remain inadequate. This, combined with growing evidence that other industries are engaging in similar practices, illustrates the urgent need to develop more robust systems to protect scientific integrity.

12.
Epilepsy Behav ; 150: 109555, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38128315

ABSTRACT

Guidance documents play a pivotal role in shaping the management of status epilepticus (SE). However, the methodological quality of these documents remains uncertain. In this systematic review, we comprehensively searched 12 literature and guideline databases to assess the quality of clinical practice guidelines and consensus statements related to SE management using the AGREE II methodology. Additionally, we summarized the associated recommendations. We identified a total of 14 clinical practice guidelines and 11 consensus statements spanning the period from 1993 to 2022. The median score for clarity of presentation was 71.8% (ranging from 15.3% to 91.7%), indicating generally good clarity. However, the domain of editorial independence received poor ratings, with a median score of 32.1% (ranging from 0% to 83.3%). Notably, the 2016 guideline published by the American Epilepsy Society (AES) received the highest overall scores. Across these guidance documents, there was consistency in the definition and diagnosis of SE. However, significant variability was observed in therapeutic recommendations, particularly in the timing for adding or changing medications. The methodological approaches used in most SE guidance documents require improvement, and the disparities in recommendations highlight existing gaps in evidence. Greater methodological rigor yields more standardized guidelines and, consequently, greater reference value. Given the urgency of SE as an emergency condition, it is imperative that these documents also address relevant management strategies before hospital admission.


Subjects
Epilepsy, Status Epilepticus, Humans, Consensus, Hospitalization, Status Epilepticus/diagnosis, Status Epilepticus/therapy, United States, Practice Guidelines as Topic
13.
Acta Neurochir Suppl ; 135: 147-155, 2023.
Article in English | MEDLINE | ID: mdl-38153463

ABSTRACT

The management of Chiari 1 malformation (CM1) and syringomyelia (Syr) has seen many changes in surgical indications and techniques over time. The dedicated neurosurgical and neurological community has recently sought to analyze the state of the art and to achieve uniformity of practice. This has led to international consensus documents on diagnostic criteria and therapeutic strategies. We aimed to evaluate, in a large, monocentric surgical series of adult and pediatric CM1 patients, whether daily clinical practice reflects the consensus documents. Our series comprises 190 pediatric and 220 adult Chiari patients who underwent surgery between 2000 and 2021. The main indications for treatment were the presence of Syr and symptoms related to CM1. While there is close correspondence with the consensus statements on what to do for Syr and symptomatic CM1, the agreement is less evident for CM1 associated with craniosynostosis or hydrocephalus, especially in the early part of the series. However, we believe that performing such studies could increase the homogeneity of surgical series, establish a common way of evaluating long-term outcomes, and strengthen the comparability of the strategies adopted at different referral centers.


Subjects
Arnold-Chiari Malformation, Hydrocephalus, Syringomyelia, Adult, Humans, Child, Consensus, Arnold-Chiari Malformation/surgery, Syringomyelia/surgery
14.
Crit Rev Anal Chem ; : 1-14, 2023 Nov 07.
Article in English | MEDLINE | ID: mdl-37934615

ABSTRACT

Chronological sequencing of ink strokes has long been a challenge for forensic document examiners (FDEs). Document forgery is a common practice, and the ability to determine the order in which the primary and subsequent strokes were made is crucial for establishing the authenticity of a document. Lately, the prime thrust of establishing the sequence of intersecting ink lines has shifted from an optical to an analytical approach. Several studies have explored the use of spectroscopic techniques in determining the sequence of ink strokes made using gel pen inks, ballpoint pen inks, fountain pen inks, printed ink, stamp inks, etc. The present study reviews existing trends in examining the sequence of ink strokes, or line crossings, using vibrational spectroscopic techniques, viz. infrared and Raman spectroscopy. Several interesting inferences have been drawn; for example, factors such as paper type and the time gap between the application of two intersecting strokes do not influence the determination of their sequence. Pairing vibrational spectroscopy with a second analytical technique, viz. VSC, AFM, HPTLC, TOF-SIMS, or SEM/EDX, has been found to provide reliable results. The study also suggests future research directions in the field, aiming to address the challenges faced by FDEs and to provide accurate and reliable solutions for document examination.

15.
J Law Biosci ; 10(2): lsad025, 2023.
Article in English | MEDLINE | ID: mdl-37901886

ABSTRACT

Innovations in neurotechnologies have ignited conversations about ethics around the world, with implications for researchers, policymakers, and the private sector. The human rights impacts of neurotechnologies have drawn the attention of United Nations bodies; nearly 40 states are tasked with implementing the Organization for Economic Co-operation and Development's principles for responsible innovation in neurotechnology; and the United States is considering placing export controls on brain-computer interfaces. Against this backdrop, we offer the first review and analysis of neuroethics guidance documents recently issued by prominent government, private, and academic groups, focusing on commonalities and divergences in articulated goals; envisioned roles and responsibilities of different stakeholder groups; and the suggested role of the public. Drawing on lessons from the governance of other emerging technologies, we suggest implementation and evaluation strategies to guide practitioners and policymakers in operationalizing these ethical norms in research, business, and policy settings.

16.
J Imaging ; 9(10), 2023 Oct 20.
Article in English | MEDLINE | ID: mdl-37888337

ABSTRACT

X-ray Computed Tomography (CT), a technique commonly used in a wide variety of research fields, nowadays represents a unique and powerful procedure to discover, reveal and preserve a fundamental part of our patrimony: ancient handwritten documents. For modern, well-preserved documents, traditional scanning systems are suitable for correct digitization and, consequently, preservation; however, digitizing ancient, fragile and damaged manuscripts is still a formidable challenge for conservators. The X-ray tomographic approach has already proven its effectiveness in data acquisition, but the algorithmic steps from tomographic images to real page-by-page extraction and reading remain a difficult undertaking. In this work, we propose a new procedure for the segmentation of single pages from the 3D tomographic data of closed historical manuscripts, based on geometric features and flood-fill methods. The results demonstrate the capability of the methodology to segment the individual pages from the whole acquired CT volume.
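
Editorial note: the flood-fill idea at the core of the segmentation step can be sketched as follows. This toy Python example uses a synthetic binarized volume and is only a hedged illustration of the general technique, not the authors' pipeline.

```python
# Grow a region from a seed voxel on one "page" of a toy binarized CT volume.
import numpy as np
from skimage.segmentation import flood

# Toy volume: 1 = paper/ink voxels, 0 = air between pages (layout is an assumption)
volume = np.zeros((64, 64, 64), dtype=np.uint8)
volume[:, 10, :] = 1   # a flat "page" at y = 10
volume[:, 30, :] = 1   # a second page at y = 30

seed = (32, 10, 32)    # a voxel known to lie on the first page
page_mask = flood(volume, seed, connectivity=1)  # grows only through connected voxels

print("voxels assigned to the seeded page:", int(page_mask.sum()))
```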

17.
J Pharm Biomed Anal ; 236: 115751, 2023 Nov 30.
Article in English | MEDLINE | ID: mdl-37778202

ABSTRACT

Liposomes are nano-sized lipid-based vesicles widely studied for their drug delivery capabilities. Compared with standard carriers, they exhibit better properties, such as improved site targeting and drug release, protection of drugs from degradation and clearance, and lower toxic side effects. At present, the scientific literature is rich in studies of liposome-based systems, 14 types of liposomal products have been authorized for the market by the EMA and FDA, and many others have been approved by national agencies. Although interest in nanodevices and nanomedicine has steadily increased over the last two decades, the documentation regulating and standardizing all phases of their development and quality control still suffers from major inadequacies, owing to the intrinsic complexity of nano-system characterization. Many generic documents (Type 1) discussing guidelines for the study of nano-systems (lipidic and otherwise) have been proposed, while robust and standardized methods (Type 2 documents) are lacking. As a result, a wide spread of different techniques, approaches, and methodologies is in use, generating results of variable quality that are hard to compare with one another. Additionally, such documents are often subject to updates and rewriting, further complicating the topic. Within this context, this work aims to bridge the gap in liposome characterization: the most recent standardized methodologies suitable for liposome characterization are reported here (with the corresponding Type 2 documents) and reviewed in a short, pragmatic way, providing the reader with a practical overview of the state of the art. In particular, this paper emphasizes the methodologies developed to evaluate the main critical quality attributes (CQAs) required for the market approval of liposomes.


Subjects
Drug Delivery Systems, Liposomes, Drug Delivery Systems/methods, Controlled Drug Release
18.
Stud Health Technol Inform ; 307: 172-179, 2023 Sep 12.
Article in English | MEDLINE | ID: mdl-37697851

ABSTRACT

The task of automatically analyzing the textual content of documents faces a number of challenges in general, but even more so in the medical domain. Here, we cannot normally rely on specifically pre-trained NLP models or even, for data privacy reasons, on (massive) amounts of training material with which to generate such models. We therefore propose a method that uses general-purpose basic text analysis components and state-of-the-art transformer models to represent a corpus of documents as multiple graphs, wherein important, conceptually related phrases from the documents constitute the nodes and their semantic relations form the edges. This method could serve as a basis for several explorative procedures and is able to draw on a plethora of publicly available resources. We test it by comparing the effectiveness of these so-called Concept Graphs with another recently suggested approach on a common use case in information retrieval: document clustering.
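
Editorial note: a hedged Python sketch of the Concept Graph construction described above is given below. The phrase list, encoder model and similarity threshold are illustrative assumptions, not the authors' components.

```python
# Key phrases become nodes; transformer-based semantic similarity forms the weighted edges.
from itertools import combinations

import networkx as nx
from sentence_transformers import SentenceTransformer, util

phrases = ["chest pain", "myocardial infarction", "discharge summary", "beta blocker"]

model = SentenceTransformer("all-MiniLM-L6-v2")     # general-purpose encoder
embeddings = model.encode(phrases, convert_to_tensor=True)
similarity = util.cos_sim(embeddings, embeddings)   # pairwise cosine similarities

G = nx.Graph()
G.add_nodes_from(phrases)
for i, j in combinations(range(len(phrases)), 2):
    score = float(similarity[i, j])
    if score > 0.3:                                 # keep only clearly related phrase pairs
        G.add_edge(phrases[i], phrases[j], weight=score)

print(G.edges(data=True))
```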


Subjects
Electric Power Supplies, Information Storage and Retrieval, Cluster Analysis, Privacy, Semantics
19.
Neural Netw ; 167: 865-874, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37741068

ABSTRACT

In this paper, we propose a novel deep neural model for Mathematical Expression Recognition (MER). The proposed model uses an encoder-decoder transformer architecture, supported by additional pre/post-processing modules, to recognize an image of a mathematical formula and convert it into a well-formed language. A novel pre-processing module based on domain prior knowledge is proposed to generate random pads around the formula image, creating more efficient feature maps and keeping all the encoder neurons active during training. Also, a new post-processing module is developed which uses a sliding window to extract additional position-based information from the feature map, which proves useful in the recognition process. The recurrent decoder module combines the feature maps with the additional position-based information, taking advantage of a soft attention mechanism, to transcribe the formula context into the well-formed LaTeX language. Finally, a novel Reinforcement Learning (RL) module processes the decoder output and tunes its results by sending appropriate feedback to the previous steps. The experimental results on the im2latex-100k benchmark dataset indicate that each devised pre/post-processing module, as well as the RL refinement module, has a positive effect on the performance of the proposed model. The results also demonstrate the higher accuracy of the proposed model compared to state-of-the-art methods.
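
Editorial note: the random-padding pre-processing step lends itself to a short illustration. The following Python sketch is a hedged reading of that idea, with pad ranges and image sizes chosen arbitrarily rather than taken from the paper.

```python
# Place a formula image at a random offset inside a larger white canvas so that
# the encoder sees varied spatial contexts during training.
import random

import numpy as np
from PIL import Image

def random_pad(formula_img: Image.Image, max_pad: int = 32) -> Image.Image:
    """Return the formula image surrounded by random amounts of white padding."""
    left, top = random.randint(0, max_pad), random.randint(0, max_pad)
    right, bottom = random.randint(0, max_pad), random.randint(0, max_pad)
    w, h = formula_img.size
    canvas = Image.new("L", (w + left + right, h + top + bottom), color=255)
    canvas.paste(formula_img, (left, top))
    return canvas

# Usage on a synthetic 120x40 "formula" (all-black rectangle as a stand-in)
img = Image.fromarray(np.full((40, 120), 0, dtype=np.uint8))
padded = random_pad(img)
print(padded.size)
```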


Subjects
Learning, Recognition (Psychology), Neurons, Benchmarking, Knowledge, Computer-Assisted Image Processing
20.
J Psycholinguist Res ; 52(6): 2429-2451, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37646884

ABSTRACT

This study investigated the challenges that Saudi undergraduate learners face in translating oil contracts from English into Arabic. The study used a quantitative approach to data collection. The sample consisted of 18 Saudi undergraduate learners from translation departments at several Saudi universities. To achieve the objectives of the study, a test was designed and administered. Additionally, the relevant theoretical framework of legal translation and the features of oil contracts were analysed to pinpoint the problematic areas and gaps. The results indicated that the undergraduate learners produced unacceptable translations with respect to lexical and textual features, and poor translations with respect to syntactic features. The study suggests some practical solutions to help translators overcome the difficulties of legal texts.


Subjects
Research Design, Translation, Humans, Surveys and Questionnaires, Students